Neural Symbolic Machines: Learning Semantic Parsers on Freebase with Weak Supervision (Short Version)

Authors

  • Chen Liang
  • Jonathan Berant
  • Quoc Le
  • Kenneth D. Forbus
  • Ni Lao

Abstract

Extending the success of deep neural networks to natural language understanding and symbolic reasoning requires complex operations and external memory. Recent neural program induction approaches have attempted to address this problem, but are typically limited to differentiable memory, and consequently cannot scale beyond small synthetic tasks. In this work, we propose the Manager-Programmer-Computer framework, which integrates neural networks with non-differentiable memory to support abstract, scalable and precise operations through a friendly neural computer interface. Specifically, we introduce a Neural Symbolic Machine (NSM), which contains a sequence-to-sequence neural "programmer" and a non-differentiable "computer" that is a Lisp interpreter with code assist. To successfully apply REINFORCE for training, we augment it with approximate gold programs found by an iterative maximum likelihood training process. NSM is able to learn a semantic parser from weak supervision over a large knowledge base. It achieves new state-of-the-art performance on WEBQUESTIONSSP, a challenging semantic parsing dataset. Compared to previous approaches, NSM is end-to-end and therefore does not rely on feature engineering or domain-specific knowledge.
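
The training recipe mentioned in the abstract, REINFORCE augmented with approximate gold programs found by iterative maximum likelihood, can be pictured with a small, self-contained sketch. Everything below (the mixed_loss helper, the 0.5 mixing weight, the example numbers) is an illustrative assumption, not the authors' implementation.

```python
# A minimal, illustrative sketch (not the authors' code) of the training
# objective described above: a REINFORCE-style policy-gradient term,
# augmented with a maximum-likelihood term on a "pseudo-gold" program,
# i.e. the highest-reward program found so far by iterative ML search.
# The mixing weight and helper names below are assumptions for illustration.


def mixed_loss(sampled_logprob: float, sampled_reward: float,
               baseline: float, pseudo_gold_logprob: float,
               ml_weight: float = 0.5) -> float:
    """Return a scalar loss combining policy-gradient and ML terms."""
    # REINFORCE term: scale the sampled program's log-probability by its
    # advantage (reward minus a baseline) to reduce gradient variance.
    pg_loss = -(sampled_reward - baseline) * sampled_logprob
    # ML term: increase the likelihood of the best program found so far.
    ml_loss = -pseudo_gold_logprob
    return (1.0 - ml_weight) * pg_loss + ml_weight * ml_loss


if __name__ == "__main__":
    # Example: a sampled program with reward 0.7 against a baseline of 0.4.
    print(mixed_loss(sampled_logprob=-2.3, sampled_reward=0.7,
                     baseline=0.4, pseudo_gold_logprob=-1.1))
```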

Similar articles

Neural Symbolic Machines: Learning Semantic Parsers on Freebase with Weak Supervision

Extending the success of deep neural networks to high-level tasks like natural language understanding and symbolic reasoning requires program induction and learning with weak supervision. Recent neural program induction approaches have either used a primitive computation component, like a Turing machine, or differentiable operations and memory trained by backpropagation. In this work, we propose the ...

Neural Symbolic Machines: Learning Semantic Parsers on Freebase with Weak Supervision

Harnessing the statistical power of neural networks to perform language understanding and symbolic reasoning is difficult when it requires executing efficient discrete operations against a large knowledge base. In this work, we introduce a Neural Symbolic Machine (NSM), which contains (a) a neural “programmer”, i.e., a sequence-to-sequence model that maps language utterances to programs and ut...
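
To make the programmer/computer split concrete, here is a minimal, hypothetical sketch: a toy Lisp-like "computer" that executes one-hop programs against a tiny triple store and offers "code assist" by listing the valid next tokens, plus a greedy stub standing in for the seq2seq "programmer". The toy knowledge base, program format, and all names are assumptions for illustration, not the NSM implementation.

```python
# Illustrative sketch only: a tiny "computer" with code assist and a stub
# "programmer" that consumes its suggestions.

TRIPLES = {("usa", "capital", "washington_dc")}   # toy knowledge base
RELATIONS = sorted({r for _, r, _ in TRIPLES})
ENTITIES = sorted({e for e, _, _ in TRIPLES})


def execute(program):
    """Execute a flat (hop ENTITY RELATION) program against the triple store."""
    _, entity, relation = program
    return {o for (s, r, o) in TRIPLES if s == entity and r == relation}


def code_assist(partial):
    """Return the tokens that can legally extend the partial program."""
    if not partial:
        return ["hop"]
    if len(partial) == 1:
        return ENTITIES
    if len(partial) == 2:
        return RELATIONS
    return []  # program is complete


def stub_programmer(question):
    """Greedy stand-in for the seq2seq programmer: always take the first
    valid token offered by the computer's code assist."""
    program = []
    while True:
        options = code_assist(program)
        if not options:
            return program
        program.append(options[0])


if __name__ == "__main__":
    prog = stub_programmer("what is the capital of the usa")
    print(prog, "->", execute(prog))  # ['hop', 'usa', 'capital'] -> {'washington_dc'}
```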

Weakly Supervised Training of Semantic Parsers

We present a method for training a semantic parser using only a knowledge base and an unlabeled text corpus, without any individually annotated sentences. Our key observation is that multiple forms of weak supervision can be combined to train an accurate semantic parser: semantic supervision from a knowledge base, and syntactic supervision from dependency-parsed sentences. We apply our approach ...

Response-based Learning for Machine Translation of Open-domain Database Queries

Response-based learning makes it possible to adapt a statistical machine translation (SMT) system to an extrinsic task by extracting supervision signals from task-specific feedback. In this paper, we elicit response signals for SMT adaptation by executing semantic parses of translated queries against the Freebase database. The challenge of our work lies in scaling semantic parsers to the lexical diversity...

A Semantic Loss Function for Deep Learning Under Weak Supervision

This paper develops a novel methodology for using symbolic knowledge in deep learning. We define a semantic loss function that bridges between neural output vectors and logical constraints. This loss function captures how close the neural network is to satisfying the constraints on its output. An experimental evaluation shows that our semantic loss function effectively guides the learner to ach...
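
As a concrete illustration of a loss that "captures how close the neural network is to satisfying the constraints", the sketch below computes a semantic loss for the simple exactly-one constraint: the negative log-probability that exactly one output is active when outputs are treated as independent Bernoulli variables. This is an assumed, illustrative instance of such a constraint, not the paper's released code.

```python
# Illustrative sketch of a semantic loss for the "exactly one label is true"
# constraint: -log sum_i p_i * prod_{j != i} (1 - p_j).

import math


def exactly_one_semantic_loss(probs):
    """Semantic loss for the exactly-one constraint over sigmoid outputs."""
    satisfy = 0.0
    for i, p_i in enumerate(probs):
        term = p_i
        for j, p_j in enumerate(probs):
            if j != i:
                term *= (1.0 - p_j)
        satisfy += term
    return -math.log(satisfy)


if __name__ == "__main__":
    # A confident, constraint-respecting prediction incurs a small loss ...
    print(exactly_one_semantic_loss([0.9, 0.05, 0.05]))
    # ... while a prediction that activates two labels is penalised more.
    print(exactly_one_semantic_loss([0.9, 0.8, 0.05]))
```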

Journal:
  • CoRR

Volume: abs/1612.01197

Publication date: 2016